Search results for "Statistics - Computation"

Showing 10 of 41 documents

Coupled conditional backward sampling particle filter

2020

The conditional particle filter (CPF) is a promising algorithm for general hidden Markov model smoothing. Empirical evidence suggests that the variant of CPF with backward sampling (CBPF) performs well even with long time series. Previous theoretical results have not been able to demonstrate the improvement brought by backward sampling, whereas we provide rates showing that CBPF can remain effective with a fixed number of particles independent of the time horizon. Our result is based on analysis of a new coupling of two CBPFs, the coupled conditional backward sampling particle filter (CCBPF). We show that CCBPF has good stability properties in the sense that with a fixed number of particles, …
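For background, the sketch below is a minimal conditional particle filter on a toy linear-Gaussian HMM of our own choosing (x_t = 0.9 x_{t-1} + v_t, y_t = x_t + e_t): one particle path is pinned to a reference trajectory while the rest evolve as in a bootstrap filter. The backward-sampling step of the CBPF analysed in the paper, which re-draws the ancestry backwards in time, is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)

def cpf(y, ref, n_part=100):
    """One sweep of a conditional particle filter; `ref` is the
    conditioned (reference) trajectory, kept alive as particle 0."""
    T = len(y)
    x = np.zeros((T, n_part))
    anc = np.zeros((T, n_part), dtype=int)
    x[0] = rng.standard_normal(n_part)
    x[0, 0] = ref[0]                       # pin the reference particle
    for t in range(1, T):
        w = np.exp(-0.5 * (y[t - 1] - x[t - 1]) ** 2)
        a = rng.choice(n_part, size=n_part, p=w / w.sum())
        x[t] = 0.9 * x[t - 1, a] + rng.standard_normal(n_part)
        x[t, 0], a[0] = ref[t], 0          # keep the reference path intact
        anc[t] = a
    # Pick one trajectory in proportion to the final weights, trace it back.
    w = np.exp(-0.5 * (y[-1] - x[-1]) ** 2)
    k = rng.choice(n_part, p=w / w.sum())
    path = np.empty(T)
    for t in range(T - 1, -1, -1):
        path[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return path

# One CPF sweep on simulated data, using a zero reference trajectory.
T = 25
xs = np.zeros(T)
for t in range(1, T):
    xs[t] = 0.9 * xs[t - 1] + rng.standard_normal()
y = xs + rng.standard_normal(T)
path = cpf(y, ref=np.zeros(T))
```

Iterating this sweep, each time feeding the returned path back in as the new reference, yields the CPF Markov chain whose stability the paper studies.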

Keywords: FOS: Computer and information sciences; FOS: Mathematics; Statistics - Computation (stat.CO); Mathematics - Probability (math.PR); MSC: 65C05 (primary), 60J05, 65C35, 65C40, 65C60 (secondary); particle filter; conditional particle filter; backward sampling; coupling; hidden Markov model; smoothing; sampling (statistics); unbiased; convergence rate; time horizon; stability (probability); series (mathematics); Markov chains; stochastic processes; Monte Carlo methods; numerical analysis; applied mathematics; Statistics, Probability and Uncertainty

Array programming with NumPy.

2020

Array programming provides a powerful, compact and expressive syntax for accessing, manipulating and operating on data in vectors, matrices and higher-dimensional arrays. NumPy is the primary array programming library for the Python language. It has an essential role in research analysis pipelines in fields as diverse as physics, chemistry, astronomy, geoscience, biology, psychology, materials science, engineering, finance and economics. For example, in astronomy, NumPy was an important part of the software stack used in the discovery of gravitational waves [1] and in the first imaging of a black hole [2]. Here we review how a few fundamental array concepts lead to a simple and powerful programmi…
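As a small illustration of the array-programming style the abstract describes, the pairwise-distance computation below replaces an explicit double loop with a single broadcast expression (the point set is our own toy example):

```python
import numpy as np

# Hypothetical 2-D point set, chosen so the distances are easy to check.
points = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])

# Broadcasting: (n, 1, 2) - (1, n, 2) -> (n, n, 2) pairwise differences.
diff = points[:, None, :] - points[None, :, :]
dists = np.sqrt((diff ** 2).sum(axis=-1))

print(dists[0, 1])  # Euclidean distance between the first two points -> 5.0
```

The same broadcasting rule generalises to any number of dimensions, which is what makes vectorised pipelines in the fields listed above possible without explicit loops.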

Keywords: FOS: Computer and information sciences; FOS: Mathematics; Statistics - Computation (stat.CO); Computer Science - Mathematical Software (cs.MS); NumPy; Python (programming language); array programming; programming languages; programming paradigm; application programming interface; interoperability; software libraries; software design; software engineering; computational science; computational biology; computational neuroscience; solar physics; multidisciplinary; review article; Nature

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…

Keywords: FOS: Computer and information sciences; Statistics and Probability; Statistics - Computation (stat.CO); approximate Bayesian computation; Markov chain Monte Carlo; tolerance choice; adaptive algorithm; importance sampling; estimator; confidence interval; inference; mixing (mathematics); probabilistic logic; algorithms; Bayesian methods; Markov chains; Monte Carlo methods; Applied Mathematics; General Mathematics; Agricultural and Biological Sciences (miscellaneous); Statistics, Probability and Uncertainty

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
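A minimal sketch of the group-compression idea, in our own simplified reading of the abstract: each group of weighted samples is summarised by one particle resampled within the group, carrying the group's total unnormalised weight, and the compressed particles are then combined as in ordinary self-normalised importance sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def compress_group(samples, weights):
    """Summarise a weighted population by one resampled representative
    plus the group's total (unnormalised) weight."""
    pick = rng.choice(len(samples), p=weights / weights.sum())
    return samples[pick], weights.sum()

def group_estimate(n_groups=200, group_size=50):
    """Self-normalised estimate of E[x] under N(0,1), proposal N(0, 2^2)."""
    reps = np.empty(n_groups)
    gws = np.empty(n_groups)
    for g in range(n_groups):
        x = 2.0 * rng.standard_normal(group_size)
        # target/proposal ratio up to a constant: exp(-x^2/2 + x^2/8)
        w = np.exp(-0.375 * x ** 2)
        reps[g], gws[g] = compress_group(x, w)
    return (gws * reps).sum() / gws.sum()

est = group_estimate()
print(est)
```

The estimate should be close to the true posterior mean (0 here); the point of the compression is that each group's full population never needs to be stored or communicated.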

Keywords: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Machine Learning (stat.ML); Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Computer Science - Computational Engineering, Finance and Science (cs.CE); Monte Carlo method; importance sampling; Markov chain Monte Carlo; multiple-try Metropolis; particle filter; resampling; posterior probability; signal processing; digital signal processing; computer vision and pattern recognition; applied mathematics; artificial intelligence; algorithm; Statistics, Probability and Uncertainty

Deep Importance Sampling based on Regression for Model Inversion and Emulation

2021

Understanding systems by forward and inverse modeling is a recurrent topic of research in many domains of science and engineering. In this context, Monte Carlo methods have been widely used as powerful tools for numerical inference and optimization. They require the choice of a suitable proposal density that is crucial for their performance. For this reason, several adaptive importance sampling (AIS) schemes have been proposed in the literature. We here present an AIS framework called Regression-based Adaptive Deep Importance Sampling (RADIS). In RADIS, the key idea is the adaptive construction via regression of a non-parametric proposal density (i.e., an emulator), which mimics the posteri…
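The emulator-as-proposal idea can be sketched with a simple construction of our own (piecewise-linear interpolation in place of the paper's regression, and a toy one-dimensional target): evaluate the costly target at a few nodes, interpolate a cheap emulator, draw from the emulator on a grid, and correct the mismatch with importance weights that use the true target.

```python
import numpy as np

rng = np.random.default_rng(11)

def target(x):
    """Stands in for an expensive posterior density (unnormalised)."""
    return np.exp(-0.5 * (x - 1.0) ** 2)

nodes = np.linspace(-4.0, 6.0, 9)      # design points ("expensive" calls)
vals = target(nodes)

grid = np.linspace(-4.0, 6.0, 2001)
emul = np.interp(grid, nodes, vals)    # non-parametric emulator/proposal
probs = emul / emul.sum()

xs = rng.choice(grid, size=5000, p=probs)      # sample from the emulator
w = target(xs) / np.interp(xs, nodes, vals)    # correct emulator mismatch
est = (w * xs).sum() / w.sum()                 # self-normalised posterior mean
print(est)
```

An adaptive scheme such as RADIS would additionally add new nodes where the weights reveal the emulator is poor, refining the proposal over iterations; that loop is omitted here.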

Keywords: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Machine Learning (stat.ML); Statistics - Computation (stat.CO); importance sampling; adaptive regression; Bayesian inference; Monte Carlo method; posterior probability; surrogate model; emulation; model inversion; Gaussian process; remote sensing; signal processing; statistics; computer vision and pattern recognition; applied mathematics; artificial intelligence; algorithm; Statistics, Probability and Uncertainty

Compressed Particle Methods for Expensive Models With Application in Astronomy and Remote Sensing

2021

In many inference problems, the evaluation of complex and costly models is often required. In this context, Bayesian methods have become very popular in several fields in recent years, in order to obtain parameter inversion, model selection or uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. In order to reduce the computational cost of the corresponding technique, surrogate models (also called emulators) are often employed. Another alternative approach is the so-called Approximate Bayesian Computation (ABC) sc…

Keywords: FOS: Computer and information sciences; Machine Learning (stat.ML); Statistics - Computation (stat.CO); Computer Science - Computational Engineering, Finance and Science (cs.CE); Bayesian inference; approximate Bayesian computation; Monte Carlo method; model selection; importance sampling; particle filtering; numerical inversion; uncertainty quantification; remote sensing; astronomy; aerospace engineering; IEEE Transactions on Aerospace and Electronic Systems

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference needs the computation of a posteriori estimators which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
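A minimal multiple-try Metropolis step, using the common weight choice w(y) = pi(y) with a symmetric random-walk proposal, and a standard normal as our toy target (the review covers many more weight functions and variants):

```python
import numpy as np

rng = np.random.default_rng(3)

def target(x):
    """Unnormalised standard normal density (toy target)."""
    return np.exp(-0.5 * x ** 2)

def mtm_step(x, k=5, scale=2.0):
    # Draw k candidates; select one with probability proportional to pi.
    ys = x + scale * rng.standard_normal(k)
    wy = target(ys)
    y = ys[rng.choice(k, p=wy / wy.sum())]
    # Reference set: k-1 fresh points around y, plus the current state x.
    xs = np.append(y + scale * rng.standard_normal(k - 1), x)
    wx = target(xs)
    # Generalised MH acceptance ratio for symmetric proposals.
    alpha = min(1.0, wy.sum() / wx.sum())
    return y if rng.random() < alpha else x

x, chain = 0.0, []
for _ in range(10000):
    x = mtm_step(x)
    chain.append(x)
chain = np.array(chain)
print(chain.mean(), chain.var())
```

Compared with plain Metropolis-Hastings, the k candidates let each iteration explore more of the sample space at the cost of extra target evaluations.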

Keywords: FOS: Computer and information sciences; Machine Learning (stat.ML); Statistics - Computation (stat.CO); Monte Carlo method; Markov chain Monte Carlo; multiple-try Metropolis; Bayesian inference; estimator; sample space; signal processing; applied mathematics; artificial intelligence; algorithm; computer vision and pattern recognition; Statistics, Probability and Uncertainty

Open Data Quality Evaluation: A Comparative Analysis of Open Data in Latvia

2020

Nowadays open data is entering the mainstream: it is freely available to every stakeholder and is often used in business decision-making. It is important to be sure that data is trustworthy and error-free, as quality problems can lead to huge losses. The research discusses how (open) data quality can be assessed. It also covers the main points that should be considered when developing a data quality management solution. One specific approach is applied to several Latvian open data sets. The research provides a step-by-step open data set analysis guide and summarizes its results. It is also shown there could exist differences in data quality depending on data supplier (centralized and decentralized d…

Keywords: FOS: Computer and information sciences; General Computer Science; Computer Science - Databases (cs.DB); Computer Science - Information Retrieval (cs.IR); Computer Science - Computers and Society (cs.CY); Statistics - Applications (stat.AP); Statistics - Computation (stat.CO); open data; data quality; stakeholder; decision-making; risk analysis (engineering); quality (business); Latvian (language)

On resampling schemes for particle filters with weakly informative observations

2022

We consider particle filters with weakly informative observations (or 'potentials') relative to the latent state dynamics. The particular focus of this work is on particle filters to approximate time-discretisations of continuous-time Feynman--Kac path integral models -- a scenario that naturally arises when addressing filtering and smoothing problems in continuous time -- but our findings are indicative of weakly informative settings beyond this context too. We study the performance of different resampling schemes, such as systematic resampling, SSP (Srinivasan sampling process) and stratified resampling, as the time-discretisation becomes finer and also identify their continuous-time l…
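Two of the schemes compared in the abstract, stratified and systematic resampling, differ only in how many uniforms they draw against the inverse CDF of the normalised weights; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

def stratified_resample(weights):
    """One independent uniform per stratum [i/n, (i+1)/n)."""
    n = len(weights)
    u = (np.arange(n) + rng.random(n)) / n
    return np.searchsorted(np.cumsum(weights), u)

def systematic_resample(weights):
    """A single shared uniform, shifted through all strata."""
    n = len(weights)
    u = (np.arange(n) + rng.random()) / n
    return np.searchsorted(np.cumsum(weights), u)

w = np.array([0.1, 0.2, 0.3, 0.4])   # normalised particle weights
idx_st = stratified_resample(w)
idx_sy = systematic_resample(w)
print(idx_st, idx_sy)
```

Both are unbiased (particle i is copied n*w_i times in expectation); systematic resampling uses one random number per step, which is part of why its continuous-time behaviour differs from the stratified scheme's.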

Keywords: FOS: Computer and information sciences; FOS: Mathematics; Statistics and Probability; Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Mathematics - Probability (math.PR); MSC: 65C35 (primary), 65C05, 65C60, 60J25 (secondary); particle filter; resampling; hidden Markov model; Feynman–Kac model; path integral; sampling; Markov chains; stochastic processes; statistical models; numerical analysis; Statistics, Probability and Uncertainty

Metropolis Sampling

2017

Monte Carlo (MC) sampling methods are widely applied in Bayesian inference, system simulation and optimization problems. The Markov Chain Monte Carlo (MCMC) algorithms are a well-known class of MC methods which generate a Markov chain with the desired invariant distribution. In this document, we focus on the Metropolis-Hastings (MH) sampler, which can be considered the atom of the MCMC techniques, introducing the basic notions and different properties. We describe in detail all the elements involved in the MH algorithm and the most relevant variants. Several improvements and recent extensions proposed in the literature are also briefly discussed, providing a quick but exhaustive overvie…

Keywords: FOS: Computer and information sciences; Machine Learning (stat.ML); Statistics - Computation (stat.CO); Statistics - Methodology (stat.ME); Wiley StatsRef: Statistics Reference Online